Boosted Dyadic Kernel Discriminants
We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer's conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of AdaBoost, which provides a sound strategy for searching through the finite set of hypercut hypotheses. In experiments with real-world datasets from the UCI repository, the generalization performance of the hypercut classifiers was found to be comparable to that of SVMs and k-NN classifiers. Furthermore, the computational cost of classification (at run time) was found to be similar to, or better than, that of SVMs.
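To make the idea concrete, the following is a minimal sketch of boosting dyadic hypercuts in the linear case. Each weak hypothesis is a hyperplane whose normal is the difference of one positive and one negative training point, sign((x_i - x_j)·x - b), and each boosting round exhaustively searches the finite set of (pair, threshold) hypotheses for the one with lowest weighted error. This sketch uses the standard discrete AdaBoost update rather than the paper's confidence-rated variant, omits the kernelized form, and searches only one hyperplane orientation for brevity; all function names are illustrative, not from the paper.

```python
import numpy as np

def hypercut_predict(X, xi, xj, b):
    # Dyadic hypercut defined by the opposite-class pair (xi, xj):
    # label is sign((xi - xj) . x - b), with ties sent to +1.
    return np.where(X @ (xi - xj) - b >= 0, 1, -1)

def train_boosted_hypercuts(X, y, T=10):
    n = len(y)
    D = np.full(n, 1.0 / n)              # AdaBoost example weights
    pos, neg = X[y == 1], X[y == -1]
    ensemble = []
    for _ in range(T):
        best = None                       # (weighted error, xi, xj, b)
        for xi in pos:                    # brute-force search over the
            for xj in neg:                # finite hypercut hypothesis set
                proj = np.sort(X @ (xi - xj))
                # candidate thresholds: midpoints between sorted projections
                for b in (proj[:-1] + proj[1:]) / 2:
                    pred = hypercut_predict(X, xi, xj, b)
                    err = D[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, xi, xj, b)
        err, xi, xj, b = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weak-learner weight
        pred = hypercut_predict(X, xi, xj, b)
        D *= np.exp(-alpha * y * pred)          # reweight examples
        D /= D.sum()
        ensemble.append((alpha, xi, xj, b))
    return ensemble

def predict(ensemble, X):
    # Weighted vote of the learned hypercuts.
    score = sum(a * hypercut_predict(X, xi, xj, b)
                for a, xi, xj, b in ensemble)
    return np.where(score >= 0, 1, -1)
```

The exhaustive search costs O(|pos|·|neg|·n) per round, which is what makes the finiteness of the hypercut hypothesis set useful: boosting can search it directly rather than optimizing a continuous hyperplane. The kernel extension would replace each inner product with K(x_i, x) - K(x_j, x).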